
A Decomposable Forward Process in Diffusion Models for Time-Series Forecasting

Caldas, Francisco, Kumar, Sahil, Soares, Cláudia

arXiv.org Machine Learning

We introduce a model-agnostic forward diffusion process for time-series forecasting that decomposes signals into spectral components, preserving structured temporal patterns such as seasonality more effectively than standard diffusion. Unlike prior work that modifies the network architecture or diffuses directly in the frequency domain, our proposed method alters only the diffusion process itself, making it compatible with existing diffusion backbones (e.g., DiffWave, TimeGrad, CSDI). By staging noise injection according to component energy, it maintains high signal-to-noise ratios for dominant frequencies throughout the diffusion trajectory, thereby improving the recoverability of long-term patterns. This strategy enables the model to maintain the signal structure for a longer period in the forward process, leading to improved forecast quality. Across standard forecasting benchmarks, we show that applying spectral decomposition strategies, such as the Fourier or Wavelet transform, consistently improves upon diffusion models using the baseline forward process, with negligible computational overhead. The code for this paper is available at https://anonymous.4open.science/r/D-FDP-4A29.
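The idea of staging noise injection by component energy can be illustrated with a minimal sketch. Everything below is an assumption about one plausible realization, not the paper's actual schedule: signals are decomposed with an FFT, each frequency component gets an energy-ranked delay on a cosine noise schedule, and noise variance calibration is omitted for brevity.

```python
import numpy as np

def spectral_forward_step(x, t, T, rng=None):
    """Hypothetical sketch of an energy-staged spectral forward process.
    Each Fourier component is noised on its own schedule: the more energetic
    the component, the later its schedule effectively starts, so dominant
    frequencies keep a higher SNR deeper into the forward trajectory.
    (Noise variance calibration is deliberately omitted in this sketch.)"""
    rng = np.random.default_rng(rng)
    X = np.fft.rfft(x)                          # spectral decomposition
    energy = np.abs(X) ** 2
    rank = np.argsort(np.argsort(-energy))      # 0 = most energetic component
    delay = rank / max(len(X) - 1, 1)           # normalized delay in [0, 1]
    # Component k sees an effectively earlier point t_k on the schedule.
    t_k = np.clip(t / T - 0.5 * delay, 0.0, 1.0)
    alpha_bar = np.cos(0.5 * np.pi * t_k) ** 2  # per-component cosine schedule
    noise = (rng.normal(size=X.shape) + 1j * rng.normal(size=X.shape)) / np.sqrt(2)
    X_t = np.sqrt(alpha_bar) * X + np.sqrt(1.0 - alpha_bar) * noise
    return np.fft.irfft(X_t, n=len(x))
```

At t = 0 every component has alpha_bar = 1, so the signal passes through unchanged; as t approaches T, low-energy components are destroyed first while the dominant frequencies degrade last.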


BanditLP: Large-Scale Stochastic Optimization for Personalized Recommendations

Nguyen, Phuc, Zelditch, Benjamin, Chen, Joyce, Patra, Rohit, Wei, Changshuai

arXiv.org Machine Learning

We present BanditLP, a scalable multi-stakeholder contextual bandit framework that unifies neural Thompson Sampling for learning objective-specific outcomes with a large-scale linear program for constrained action selection at serving time. The methodology is application-agnostic, compatible with arbitrary neural architectures, and deployable at web scale, with an LP solver capable of handling billions of variables. Experiments on public benchmarks and synthetic data show consistent gains over strong baselines. We apply this approach in LinkedIn's email marketing system and demonstrate a business win, illustrating the value of integrated exploration and constrained optimization in production.
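The serving-time combination of posterior sampling and constrained selection can be sketched in a few lines. This is a toy stand-in, not the paper's system: the neural posterior is replaced by an independent Gaussian per action, and the billion-variable LP is replaced by a single-budget fractional knapsack, whose LP relaxation a greedy sweep over reward/cost ratio solves exactly.

```python
import numpy as np

def select_actions(mu, sigma, cost, budget, rng=None):
    """Hypothetical sketch of a BanditLP-style serving step.
    1) Thompson Sampling: draw one reward score per action from a posterior
       (a Gaussian stand-in for a neural posterior).
    2) Constrained selection: maximize sampled reward subject to a total
       cost budget. With a single constraint, the LP relaxation is a
       fractional knapsack, solved exactly by greedy ratio ordering;
       a production system would use a large-scale LP solver instead.
    Returns a fractional assignment x in [0, 1]^n."""
    rng = np.random.default_rng(rng)
    cost = np.asarray(cost, dtype=float)
    sampled = rng.normal(mu, sigma)             # posterior draw per action
    x = np.zeros(len(cost))
    remaining = float(budget)
    for i in np.argsort(-sampled / cost):       # best reward-per-cost first
        if sampled[i] <= 0 or remaining <= 0:
            break
        x[i] = min(1.0, remaining / cost[i])
        remaining -= x[i] * cost[i]
    return x
```

With zero posterior variance the draw collapses to the mean, so the selection is deterministic; nonzero variance is what drives exploration across serving calls.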


Tesla used deceptive language to market Autopilot, California judge rules

Engadget

The judge recommends suspending Tesla's sales in the state for 30 days. Tesla's sales in California should be suspended for 30 days because its marketing around Autopilot and Full Self-Driving misled consumers, a California administrative law judge has ruled. Back in 2022, the California DMV accused the automaker of using deceptive language to advertise those products, making it seem as though its vehicles are capable of level 5 autonomous driving. Tesla has since added the word "Supervised" to the name of its Full Self-Driving assistance technology.